Universal Style Transfer via Feature Transforms

Li, Yijun, Fang, Chen, Yang, Jimei, Wang, Zhaowen, Lu, Xin, Yang, Ming-Hsuan

Neural Information Processing Systems

Universal style transfer aims to transfer arbitrary visual styles to content images. Existing feed-forward methods, while enjoying inference efficiency, are mainly limited by an inability to generalize to unseen styles or by compromised visual quality. In this paper, we present a simple yet effective method that tackles these limitations without training on any pre-defined styles. The key ingredient of our method is a pair of feature transforms, whitening and coloring, that are embedded in an image reconstruction network. The whitening and coloring transforms directly match the feature covariance of the content image to that of a given style image, which shares a similar spirit with the optimization of the Gram-matrix-based cost in neural style transfer. We demonstrate the effectiveness of our algorithm by generating high-quality stylized images and comparing them with a number of recent methods. We also analyze our method by visualizing the whitened features and by synthesizing textures through simple feature coloring.
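
For readers who want a concrete picture of the feature transforms described above, the following is a minimal NumPy sketch of a whitening-coloring transform, assuming encoder features have already been reshaped to (channels, pixels) matrices; the function name wct, the blending weight alpha, and the eps regularizer are illustrative choices rather than the authors' exact implementation.

```python
# Minimal sketch of a whitening-coloring transform (WCT) on encoder features.
# Features are assumed to be (C, H*W) matrices; names here are illustrative.
import numpy as np

def wct(fc, fs, alpha=0.6, eps=1e-5):
    """Match the covariance of content features fc to that of style features fs."""
    # Center content features and compute their covariance.
    mc = fc.mean(axis=1, keepdims=True)
    fc_c = fc - mc
    cov_c = fc_c @ fc_c.T / (fc_c.shape[1] - 1) + eps * np.eye(fc.shape[0])

    # Whitening: remove the content covariance via its eigen-decomposition.
    wc, Ec = np.linalg.eigh(cov_c)
    whitened = Ec @ np.diag(wc ** -0.5) @ Ec.T @ fc_c

    # Center style features and compute their covariance.
    ms = fs.mean(axis=1, keepdims=True)
    fs_c = fs - ms
    cov_s = fs_c @ fs_c.T / (fs_c.shape[1] - 1) + eps * np.eye(fs.shape[0])

    # Coloring: impose the style covariance, then restore the style mean.
    ws, Es = np.linalg.eigh(cov_s)
    colored = Es @ np.diag(ws ** 0.5) @ Es.T @ whitened + ms

    # Blend with the original content features to control stylization strength.
    return alpha * colored + (1 - alpha) * fc

# Example: 512-channel features from content and style images of different sizes.
fcs = wct(np.random.rand(512, 32 * 32), np.random.rand(512, 40 * 40))
```

In the paper's pipeline, the transformed features would then be fed to a pre-trained decoder to reconstruct the stylized image; the eigen-decomposition with a small eps regularizer above is one common way to keep the inverse square root numerically stable.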


Reviews: Universal Style Transfer via Feature Transforms

Neural Information Processing Systems

The paper presents a very simple idea for performing style transfer: simply use a normal auto-encoder, whiten the features of the content image, and color them with the statistics of the features of the style image. It's well written, the idea is presented clearly, and the evaluation is as good as can be expected for style transfer. I know a few works in this area, but I'm not following the latest developments closely, and there has been a very high number of papers on this topic in the last year, so it is hard for me to evaluate the novelty and interest of this paper. Technically, it seems to me a marginal improvement over [14]. In terms of results, I find style transfer work extremely difficult to evaluate and this paper was not an exception: it's clearly close to the state of the art, but it's not clear to me that it's really better than its competitors.

